- gradient optimization
- Makarov: градиентная оптимизация, оптимизация градиентным методом (gradient optimization; optimization by the gradient method)
Universal English-Russian Dictionary. Академик.ру. 2011.
Optimization (mathematics) — In mathematics, the term optimization, or mathematical programming, refers to the study of problems in which one seeks to minimize or maximize a real function by systematically choosing the values of real or integer variables from within an… Wikipedia
Gradient descent — For the analytical method called steepest descent, see Method of steepest descent. Gradient descent is an optimization algorithm. To find a local minimum of a function using gradient descent, one takes steps proportional to the negative of the… Wikipedia (a sketch of the method follows this list)
Mathematical optimization — For other uses, see Optimization (disambiguation). [Figure: the maximum of a paraboloid (red dot).] In mathematics, computational science, or management science, mathematical optimization (alternatively, optimization or mathematical programming) refers to… Wikipedia
Multidisciplinary design optimization — Multi-disciplinary design optimization (MDO) is a field of engineering that uses optimization methods to solve design problems incorporating a number of disciplines. As defined by Prof. Carlo Poloni, MDO is the art of finding the best compromise… Wikipedia
Newton's method in optimization — [Figure: a comparison of gradient descent (green) and Newton's method (red) for minimizing a function (with small step sizes); Newton's method uses curvature information to take a more direct route.] In mathematics, Newton's method is an iterative method… Wikipedia (a sketch of the method follows this list)
Shape optimization — is part of the field of optimal control theory. The typical problem is to find the shape which is optimal in that it minimizes a certain cost functional while satisfying given constraints. In many cases, the functional being solved depends on the … Wikipedia
Nonlinear conjugate gradient method — In numerical optimization, the nonlinear conjugate gradient method generalizes the conjugate gradient method to nonlinear optimization. For a quadratic function f(x) = ‖Ax − b‖², the minimum of f is obtained when the gradient is 0: ∇ₓf = 2Aᵀ(Ax − b) = 0. Whereas linear conjugate… Wikipedia (a sketch of the method follows this list)
Combinatorial optimization — In applied mathematics and theoretical computer science, combinatorial optimization is a topic that consists of finding an optimal object from a finite set of objects.[1] In many such problems, exhaustive search is not feasible. It operates on… Wikipedia
Convex optimization — Convex minimization, a subfield of optimization, studies the problem of minimizing convex functions over convex sets. Given a real vector space X together with a convex, real-valued function f defined on a convex subset of X, the problem is to find… Wikipedia
Algorithme du gradient — The gradient algorithm denotes a differentiable optimization algorithm. It is therefore intended to minimize a real differentiable function defined on a Euclidean space (for example ℝⁿ, the space of n-tuples of real numbers,… Wikipédia en Français
Ant Colony Optimization — Algorithme de colonies de fourmis: ant colony algorithms are algorithms inspired by the behavior of ants and constitute a family of optimization metaheuristics. Initially proposed by Marco Dorigo et al.… Wikipédia en Français
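
To complement the head term and the "Gradient descent" entry above, here is a minimal Python sketch of optimization by the gradient method; the quadratic test function, step size, tolerance, and function names are illustrative assumptions, not part of the excerpted articles.

```python
import numpy as np

def gradient_descent(grad, x0, step=0.1, tol=1e-8, max_iter=1000):
    """Minimize a differentiable function by stepping against its gradient."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:      # gradient ~ 0: a (local) minimum is reached
            break
        x = x - step * g                 # step proportional to the negative gradient
    return x

# Illustrative use: minimize f(x, y) = (x - 3)^2 + 2*(y + 1)^2,
# whose gradient is (2*(x - 3), 4*(y + 1)); the minimizer is (3, -1).
grad_f = lambda v: np.array([2 * (v[0] - 3), 4 * (v[1] + 1)])
print(gradient_descent(grad_f, x0=[0.0, 0.0]))  # approximately [ 3. -1.]
```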
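
For the "Newton's method in optimization" entry above, a minimal sketch of a Newton minimization step, which uses the Hessian (curvature information) instead of a fixed step against the gradient; the test function, tolerance, and iteration cap are illustrative assumptions.

```python
import numpy as np

def newton_minimize(grad, hess, x0, tol=1e-10, max_iter=50):
    """Newton's method for minimization: choose the step using curvature (the Hessian)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        # Solve H dx = g and move to x - dx: a more direct route than a fixed gradient step.
        x = x - np.linalg.solve(hess(x), g)
    return x

# Same illustrative quadratic as the gradient-descent sketch: f(x, y) = (x - 3)^2 + 2*(y + 1)^2.
grad_f = lambda v: np.array([2 * (v[0] - 3), 4 * (v[1] + 1)])
hess_f = lambda v: np.array([[2.0, 0.0], [0.0, 4.0]])
print(newton_minimize(grad_f, hess_f, x0=[0.0, 0.0]))  # reaches [ 3. -1.] in one step
```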
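
For the "Nonlinear conjugate gradient method" entry above, a sketch of one common variant (Fletcher-Reeves with Armijo backtracking); the choice of variant, the line-search constants, and the test function are illustrative assumptions, not taken from the excerpt.

```python
import numpy as np

def nonlinear_cg(f, grad, x0, tol=1e-8, max_iter=500):
    """Fletcher-Reeves nonlinear conjugate gradient with a backtracking line search."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                   # first search direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        if g.dot(d) >= 0:                    # safeguard: restart if d is not a descent direction
            d = -g
        alpha = 1.0                          # Armijo backtracking (illustrative constants)
        while f(x + alpha * d) > f(x) + 1e-4 * alpha * g.dot(d) and alpha > 1e-12:
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad(x_new)
        beta = g_new.dot(g_new) / g.dot(g)   # Fletcher-Reeves coefficient
        d = -g_new + beta * d                # conjugate direction update
        x, g = x_new, g_new
    return x

# Illustrative test: a mildly non-quadratic function with minimizer (3, -1).
f = lambda v: (v[0] - 3) ** 2 + 2 * (v[1] + 1) ** 2 + 0.1 * (v[0] - 3) ** 4
grad_f = lambda v: np.array([2 * (v[0] - 3) + 0.4 * (v[0] - 3) ** 3, 4 * (v[1] + 1)])
print(nonlinear_cg(f, grad_f, x0=[0.0, 0.0]))  # should approach [ 3. -1.]
```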